  1. PIPS: A Parallel Planning Model of Sentence Production. Laurel Brehm, Pyeong Whan Cho, Paul Smolensky & Matthew A. Goldrick - 2022 - Cognitive Science 46 (2):e13079.
  2. Fractal Analysis Illuminates the Form of Connectionist Structural Gradualness. Whitney Tabor, Pyeong Whan Cho & Emily Szkudlarek - 2013 - Topics in Cognitive Science 5 (3):634-667.
    We examine two connectionist networks—a fractal learning neural network (FLNN) and a Simple Recurrent Network (SRN)—that are trained to process center-embedded symbol sequences. Previous work provides evidence that connectionist networks trained on infinite-state languages tend to form fractal encodings. Most such work focuses on simple counting recursion cases (e.g., anbn), which are not comparable to the complex recursive patterns seen in natural language syntax. Here, we consider exponential state growth cases (including mirror recursion), describe a new training scheme that seems (...)
  3. Discovery of a Recursive Principle: An Artificial Grammar Investigation of Human Learning of a Counting Recursion Language. Pyeong Whan Cho, Emily Szkudlarek & Whitney Tabor - 2016 - Frontiers in Psychology 7.